Web Survey Bibliography
Traditionally, business data for official statistics have been collected with paper questionnaires in self-administered surveys. Nowadays the paper questionnaire is increasingly being replaced by web questionnaires. A variety of strategies can be followed to introduce the web into business surveys. In Norway in 2004, and more recently in Denmark, it was decided that all business surveys should be moved to the web quickly. In the Netherlands, an effort has been made to develop a well-designed web questionnaire: the Structural Business Survey questionnaire was fully designed and pre-tested over a two-year period. The result was intended to serve as an example for all other surveys. A driving force behind this development is a common interest, shared by surveyors and those who are surveyed, in reducing the manpower required for business surveys, and hence their costs (including response burden).
But neither these ambitions nor quality improvements come automatically with technological innovations. At international conferences, workshops and meetings, we find that many methodologists are struggling with the implementation of these technologies. In February 2010, methodologists from eight European countries met in Copenhagen to discuss how common EU-regulated surveys can best be transferred from paper to web (for both business and social surveys; the focus was on business surveys). The idea for this meeting was born when data collection methodologists from Statistics Denmark visited Statistics Netherlands in May 2009 to discuss web questionnaire designs. The initiative to organise the meeting was taken at the 2009 ISM Workshop in Bergamo. As a follow-up to the Copenhagen meeting, this topic was also on the agenda of the Eurostat Working Group on Statistical Quality in June 2010, where it was decided to discuss the need for an action plan and concrete projects with the Directors of Methodology.
This 2011 ISM presentation is a follow-up to the Copenhagen initiative and is meant to report back to the participants on what has been done. In the presentation we will give an overview of the issues that have been discussed and relate them to non-sampling errors such as non-response and measurement issues, as well as to response burden. We would like to discuss with the audience how the Copenhagen initiative and the issues raised can best be followed up.
Issues that have been discussed (and which relate to other presentations in the workshop) are:
– An issue that is discussed over and over again is how to get sampled units to pick up the web questionnaire: What strategies should be used to increase the take-up rate? Should a paper questionnaire still be available, and should it be offered alongside the web option?
– Where do we stand when it comes to guidelines on how to design web questionnaires? One much-discussed issue under this headline is how similar or different web and paper questionnaires should be in a mixed-mode data collection design. When respondents use a web questionnaire, they expect it to have some intelligence. What do respondents expect, and what guidelines can be given to make the questionnaire respondent-friendly? Relevant issues include the use of matrix questions and of edit checks, which help to obtain good data quality but may also lead respondents to abort completion of the questionnaire. Another issue is the use of historical data in the questionnaire (comparable to dependent interviewing).
– Talking about mixed-mode designs: How to deal with mode effects?
– Technology issues include, for example, how to deal with the variety of web browsers.
– How should web questionnaires be implemented? When moving to the web, Statistics Netherlands on the one hand, and Statistics Norway and Statistics Denmark on the other, adopted different approaches (as discussed above). What did we learn from these two approaches?
– Once a questionnaire has been developed, the issue is: How do new methods affect pre-tests and the ability to monitor the response process?
During the development of web questionnaires, traditional cognitive interviewing can be combined with the techniques of usability studies, e.g. by using paradata and eye-tracking during individual tests. Computerized questionnaires also open up the possibility of monitoring the response process in detail while the survey is being conducted (using paradata), both at the level of overall response rates and at the level of individual respondents (using audit trails).
In our presentation we will focus in more detail on web take-up issues and response burden issues.
Issues that have not yet been addressed, but which are important and can be discussed at the workshop, are:
– How to organise the data collection and logistics for web and mixed-mode surveys?
– What software should be used: develop one's own software or use software that is available on the market?
– How to organise research, collaborate with universities, and bring in the literature?